For this assignment, we are going to investigate serious incidents requiring the fire department to respond. Using data about the locations of firehouses and fires occurring in New York City, we want to know whether response times to fires differ across the city. Second, we will try to focus on one possible variable that could affect response times – the distance from the firehouse – and see whether we find the (expected) effect.
To keep this homework manageable, I am leaving out another part of the investigation: What is the effect of demographic and/or income characteristics of the neighborhood on response times? This is likely a bit more sensitive but also relevant from a public policy perspective.
We rely on two data sets.
NYC Open Data has data on all incidents responded to by fire companies. I have included the variable description file, which lists all available variables, in the exercise folder.
This dataset is only updated annually, and so far it contains only data from 2013 to 2015. The full dataset is also somewhat too large for an exercise (1.3M rows), so I suggest limiting yourself to a subset. I have added a file containing the subset of the most severe incidents (Level 7 - all hands) for 2015 only, which yields 2,335 incidents.
library(tidyverse)
fire_all <- read_csv("no_upload/Incidents_Responded_to_by_Fire_Companies.csv")
fire_all$year <- substr(fire_all$INCIDENT_DATE_TIME, 7, 10)
fire <- fire_all %>%
  filter(HIGHEST_LEVEL_DESC == "7 - Signal 7-5") %>%
  filter(year == "2015")  # year is a character column (created via substr above)
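If you prefer, the year can also be extracted by parsing the full timestamp with lubridate rather than with substr. An equivalent sketch, assuming the timestamps follow the MM/DD/YYYY hh:mm:ss AM/PM pattern used elsewhere in this file:

```r
library(lubridate)

# Parse the timestamp and take the year; rows that fail to parse become NA
# instead of silently yielding a wrong substring.
fire_all$year <- year(mdy_hms(fire_all$INCIDENT_DATE_TIME))

fire <- fire_all %>%
  filter(HIGHEST_LEVEL_DESC == "7 - Signal 7-5", year == 2015)
```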
Unfortunately, the addresses of the incidents were not yet geocoded. Ideally, I would like you to know how to do this yourself, but I am mindful of the hour or so it takes to run. So, here is the code; the geocodes (as far as they were returned successfully) are already part of the data.
library(ggmap)
# Make list of addresses
address <- str_c( str_to_title(fire$STREET_HIGHWAY),
"New York, NY",
fire$ZIP_CODE,
sep=", ")
# Register Google API Key
register_google(key = Sys.getenv("GOOGLE_MAPS_API_KEY"))
# Geocode Addresses
latlong <- geocode(address, output = "latlon")
# Merge on
fire$Latitude <- latlong$lat
fire$Longitude <- latlong$lon
# Save File
write_csv(fire, "severe_incidents.csv")
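Since Google geocoding calls are slow and metered, it can help to wrap this step in a simple cache check so that re-knitting the document does not re-geocode everything. A minimal sketch, reusing the severe_incidents.csv file written above:

```r
# Only geocode when no cached result exists; otherwise reuse the saved file.
if (!file.exists("severe_incidents.csv")) {
  latlong <- geocode(address, output = "latlon")
  fire$Latitude  <- latlong$lat
  fire$Longitude <- latlong$lon
  write_csv(fire, "severe_incidents.csv")
} else {
  fire <- read_csv("severe_incidents.csv")
}
```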
NYC Open Data also provides data on the location of all 218 firehouses in NYC. Relevant for our analysis are the following variables:
FacilityName, Borough, Latitude, Longitude
library(geojsonR)
library(geosphere)
library(ggplot2)
library(ggthemes)
library(htmltools)
library(htmlwidgets)
library(leaflet)
library(leaflet.extras)
library(lubridate)
library(rgdal)
library(rgeos)
library(sf)
library(sp)
library(stringr)
library(tibble)
library(tidyverse)
severe <- read_csv("severe_incidents.csv")
listings <- read_csv("FDNY_Firehouse_Listing.csv") %>%
  select(FacilityName, Borough, Latitude, Longitude)
severe <- severe %>%
filter(!is.na(Longitude))
coordinates(severe) <- ~Longitude + Latitude
boroughs <- readOGR("nyc_boro.geojson")
## OGR data source with driver: GeoJSON
## Source: "nyc_boro.geojson", layer: "OGRGeoJSON"
## with 5 features
## It has 5 fields
proj4string(severe) <- proj4string(boroughs)
severe <- severe[boroughs,]
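For reference, the same point-in-polygon filtering can be sketched with sf (already attached above). This is an alternative to the sp workflow, not the approach used in the rest of the document:

```r
# Read the incidents and boroughs as sf objects and keep only points that
# fall inside the borough polygons (subsetting an sf object by another sf
# object uses st_intersects by default).
severe_sf <- read_csv("severe_incidents.csv") %>%
  filter(!is.na(Longitude)) %>%
  st_as_sf(coords = c("Longitude", "Latitude"), crs = 4326)
boroughs_sf <- st_read("nyc_boro.geojson", quiet = TRUE)
severe_sf <- severe_sf[boroughs_sf, ]
```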
Provide a leaflet map of the severe fires contained in the file severe_incidents.csv. Ignore locations that fall outside the five boroughs of New York City. Provide at least three pieces of information on the incident in a popup.
map_severe_fires <- leaflet(severe) %>%
setView(lng = -74.0060, lat = 40.7128, zoom = 11) %>%
addProviderTiles(providers$CartoDB.PositronNoLabels) %>%
addCircles(color = "red",
label = ~htmlEscape(as.character(INCIDENT_TYPE_DESC)),
popup = paste("Incident key:",severe$IM_INCIDENT_KEY,"<br/>",
"Borough:", severe$BOROUGH_DESC,"<br/>",
"Incident date and time:", severe$INCIDENT_DATE_TIME))
map_severe_fires
Start with the previous map. Now, distinguish the markers of the fire locations by PROPERTY_USE_DESC, i.e. what kind of property was affected. If there are too many categories, collapse some categories. Choose an appropriate coloring scheme to map the locations by type of affected property. Add a legend informing the user about the color scheme. Also make sure that the information about the type of affected property is now contained in the popup information. Show this map.
Collapsing some of the categories below:
sort(unique(severe$PROPERTY_USE_DESC))
## [1] "000 - Property Use, other"
## [2] "112 - Billiard center, pool hall"
## [3] "114 - Ice rink: indoor, outdoor"
## [4] "121 - Ballroom, gymnasium"
## [5] "124 - Playground"
## [6] "130 - Places of worship, funeral parlors, other"
## [7] "131 - Church, mosque, synagogue, temple, chapel"
## [8] "140 - Clubs, other"
## [9] "142 - Clubhouse"
## [10] "143 - Yacht Club"
## [11] "150 - Public or government, other"
## [12] "160 - Eating, drinking places, other"
## [13] "161 - Restaurant or cafeteria"
## [14] "162 - Bar or nightclub"
## [15] "173 - Bus station"
## [16] "174 - Rapid transit station"
## [17] "180 - Studio/theater, other"
## [18] "182 - Auditorium, concert hall"
## [19] "210 - Schools, non-adult, other"
## [20] "211 - Preschool"
## [21] "213 - Elementary school, including kindergarten"
## [22] "215 - High school/junior high school/middle school"
## [23] "241 - Adult education center, college classroom"
## [24] "311 - 24-hour care Nursing homes, 4 or more persons"
## [25] "322 - Alcohol or substance abuse recovery center"
## [26] "331 - Hospital - medical or psychiatric"
## [27] "340 - Clinics, doctors offices, hemodialysis cntr, other"
## [28] "342 - Doctor, dentist or oral surgeon office"
## [29] "363 - Reformatory, juvenile detention center"
## [30] "365 - Police station"
## [31] "400 - Residential, other"
## [32] "419 - 1 or 2 family dwelling"
## [33] "429 - Multifamily dwelling"
## [34] "439 - Boarding/rooming house, residential hotels"
## [35] "449 - Hotel/motel, commercial"
## [36] "459 - Residential board and care"
## [37] "460 - Dormitory-type residence, other"
## [38] "464 - Barracks, dormitory"
## [39] "500 - Mercantile, business, other"
## [40] "511 - Convenience store"
## [41] "519 - Food and beverage sales, grocery store"
## [42] "549 - Specialty shop"
## [43] "564 - Laundry, dry cleaning"
## [44] "569 - Professional supplies, services"
## [45] "571 - Service station, gas station"
## [46] "579 - Motor vehicle or boat sales, services, repair"
## [47] "580 - General retail, other"
## [48] "581 - Department or discount store"
## [49] "592 - Bank"
## [50] "596 - Post office or mailing firms"
## [51] "599 - Business office"
## [52] "629 - Laboratory or science lababoratory"
## [53] "635 - Computer center"
## [54] "640 - Utility or Distribution system, other"
## [55] "642 - Electrical distribution"
## [56] "648 - Sanitation utility"
## [57] "669 - Forest, timberland, woodland"
## [58] "700 - Manufacturing, processing"
## [59] "800 - Storage, other"
## [60] "807 - Outside material storage area"
## [61] "808 - Outbuilding or shed"
## [62] "839 - Refrigerated storage"
## [63] "880 - Vehicle storage, other"
## [64] "881 - Parking garage, (detached residential garage)"
## [65] "882 - Parking garage, general vehicle"
## [66] "891 - Warehouse"
## [67] "898 - Dock, marina, pier, wharf"
## [68] "899 - Residential or self-storage units"
## [69] "900 - Outside or special property, other"
## [70] "922 - Tunnel"
## [71] "931 - Open land or field"
## [72] "937 - Beach"
## [73] "940 - Water area, other"
## [74] "946 - Lake, river, stream"
## [75] "951 - Railroad right-of-way"
## [76] "952 - Railroad yard"
## [77] "960 - Street, other"
## [78] "961 - Highway or divided highway"
## [79] "962 - Residential street, road or residential driveway"
## [80] "963 - Street or road in commercial area"
## [81] "965 - Vehicle parking area"
## [82] "974 - Aircraft loading area"
## [83] "981 - Construction site"
## [84] "984 - Industrial plant yard - area"
## [85] "NNN - None"
## [86] "UUU - Undetermined"
severe$PROPERTY_USE_DESC <- case_when(
  # Codes follow the NFIRS property-use numbering, so we can match on leading digits.
  # Order matters: more specific prefixes must come before broader ones.
  str_detect(severe$PROPERTY_USE_DESC, "^(NNN|UUU|000)") ~ "Other, none or unknown",
  str_detect(severe$PROPERTY_USE_DESC, "^(13|15|2)")     ~ "School, college, place of worship, public",
  str_detect(severe$PROPERTY_USE_DESC, "^17")            ~ "Transit station",
  str_detect(severe$PROPERTY_USE_DESC, "^1")             ~ "Bar, restaurant, concert hall, theater, playground",
  str_detect(severe$PROPERTY_USE_DESC, "^3")             ~ "Health or law enforcement",
  str_detect(severe$PROPERTY_USE_DESC, "^4")             ~ "Residential or hotel",
  str_detect(severe$PROPERTY_USE_DESC, "^59")            ~ "Bank, post, office",
  str_detect(severe$PROPERTY_USE_DESC, "^5")             ~ "Store, shop, service",
  str_detect(severe$PROPERTY_USE_DESC, "^6[234]")        ~ "Utility, laboratory, computer center",
  str_detect(severe$PROPERTY_USE_DESC, "^(7|8)")         ~ "Manufacturing, storage facility",
  str_detect(severe$PROPERTY_USE_DESC, "^(9|669)")       ~ "Outside area",
  TRUE                                                   ~ severe$PROPERTY_USE_DESC
)
sort(unique(severe$PROPERTY_USE_DESC))
## [1] "Bank, post, office"
## [2] "Bar, restaurant, concert hall, theater, playground"
## [3] "Health or law enforcement"
## [4] "Manufacturing, storage facility"
## [5] "Other, none or unknown"
## [6] "Outside area"
## [7] "Residential or hotel"
## [8] "School, college, place of worship, public"
## [9] "Store, shop, service"
## [10] "Transit station"
## [11] "Utility, laboratory, computer center"
count(severe@data, PROPERTY_USE_DESC)
## # A tibble: 11 x 2
## PROPERTY_USE_DESC n
## <chr> <int>
## 1 Bank, post, office 53
## 2 Bar, restaurant, concert hall, theater, playground 66
## 3 Health or law enforcement 17
## 4 Manufacturing, storage facility 62
## 5 Other, none or unknown 24
## 6 Outside area 184
## 7 Residential or hotel 1690
## 8 School, college, place of worship, public 27
## 9 Store, shop, service 166
## 10 Transit station 13
## 11 Utility, laboratory, computer center 8
pal = colorFactor("Set3", domain = severe$PROPERTY_USE_DESC)
color_prop_use = pal(severe$PROPERTY_USE_DESC)
map_severe_fires_type <- leaflet(severe) %>%
setView(lng = -74.0060, lat = 40.7128, zoom = 11) %>%
addProviderTiles(providers$CartoDB.PositronNoLabels) %>%
addCircles(color = color_prop_use,
label = ~htmlEscape(as.character(PROPERTY_USE_DESC)),
popup = paste("Property use:",severe$PROPERTY_USE_DESC,"<br/>",
"Borough:", severe$BOROUGH_DESC,"<br/>",
"Date and time:", severe$INCIDENT_DATE_TIME)) %>%
addLegend(pal = pal, values = ~severe$PROPERTY_USE_DESC, title = "Property use")
map_severe_fires_type
Add marker clustering, so that zooming in will reveal the individual locations but the zoomed out map only shows the clusters. Show the map with clusters.
map_severe_fires_cluster <- leaflet(severe) %>%
setView(lng = -74.0060, lat = 40.7128, zoom = 11) %>%
addProviderTiles(providers$CartoDB.PositronNoLabels) %>%
addCircleMarkers(color = color_prop_use,
label = ~htmlEscape(as.character(PROPERTY_USE_DESC)),
clusterOptions = markerClusterOptions(),
popup = paste("Property use:",severe$PROPERTY_USE_DESC,"<br/>",
"Borough:", severe$BOROUGH_DESC,"<br/>",
"Date and time:", severe$INCIDENT_DATE_TIME))
map_severe_fires_cluster
The second data file contains the locations of the 218 firehouses in New York City. Start with the non-clustered map (2b) and now adjust the size of the circle markers by severity (TOTAL_INCIDENT_DURATION or UNITS_ONSCENE seem plausible options). More severe incidents should have larger circles on the map. On the map, also add the locations of the fire houses. Add two layers (“Incidents”, “Firehouses”) that allow the user to select which information to show.
map_severe_fires_firehouses <- leaflet(severe) %>%
setView(lng = -74.0060, lat = 40.7128, zoom = 11) %>%
addProviderTiles(providers$CartoDB.PositronNoLabels) %>%
addCircleMarkers(data = listings,
lng = ~as.numeric(Longitude),
lat = ~as.numeric(Latitude),
label = ~htmlEscape(as.character(Borough)),
color = "black",
radius = 5,
fill = TRUE,
group = "Firehouses") %>%
addCircleMarkers(color = color_prop_use,
radius = ~UNITS_ONSCENE/5,
label = ~htmlEscape(as.character(PROPERTY_USE_DESC)),
fill = TRUE,
popup = paste("Property use:", severe$PROPERTY_USE_DESC,"<br/>",
"Borough:", severe$BOROUGH_DESC,"<br/>",
"Date and time:",severe$INCIDENT_DATE_TIME),
group = "Incidents") %>%
addLegend(pal = pal, values = ~severe$PROPERTY_USE_DESC, title = "Property use") %>%
addLayersControl(
baseGroups = c("Incidents", "Firehouses"),
options = layersControlOptions(collapsed = FALSE)
)
## Warning in validateCoords(lng, lat, funcName): Data contains 5 rows with
## either missing or invalid lat/lon values and will be ignored
map_severe_fires_firehouses
We now want to investigate whether the distance of the incident from the nearest firehouse varies across the city.
For all incident locations, identify the nearest firehouse and calculate the distance between the firehouse and the incident location. Provide a scatter plot showing the time until the first engine arrived (the variables INCIDENT_DATE_TIME and ARRIVAL_DATE_TIME will be helpful). If there are any interesting patterns to highlight, feel free to do so.
I started with a basic scatterplot, which showed that there wasn't much of a pattern: on the contrary, there seems to be almost no connection between reaction time and distance to the fire station. Next, I used the sizes and colours of points to highlight incidents that were distant or took a long time to reach. Again, no clear pattern emerged. Finally, I used the zip code information for the fires together with a shapefile of New York zip code areas to see if there are any patterns by location. This time some patterns did emerge: neighbourhoods in the periphery or at the very center of the city (Manhattan) had unusually long reaction times.
listings <- filter(listings, !is.na(Latitude))
distances <- distm(severe@coords[,c("Longitude", "Latitude")], listings[, c("Longitude", "Latitude")], fun = distVincentyEllipsoid)
min_indexes <- apply(distances, 1, which.min)
min_distances <- apply(distances, 1, min)
severe$closest_station_index <- min_indexes
severe$station_distance <- min_distances  # distVincentyEllipsoid returns meters
reaction_time <- (parse_date_time(severe$ARRIVAL_DATE_TIME, '%m/%d/%Y %I:%M:%S %p') -
parse_date_time(severe$INCIDENT_DATE_TIME, '%m/%d/%Y %I:%M:%S %p'))
severe$reaction_time <- duration(as.double(reaction_time), "seconds")
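Before plotting, it is worth sanity-checking the computed reaction times: missing or negative values (from unparseable or out-of-order timestamps) would distort the scatterplot. A small check along these lines, not part of the original analysis:

```r
# Distribution of reaction times in minutes; NAs indicate timestamps that
# failed to parse.
summary(as.numeric(severe$reaction_time) / 60)

# Drop incidents with missing or negative reaction times before plotting.
severe <- severe[!is.na(severe$reaction_time) &
                   as.numeric(severe$reaction_time) >= 0, ]
```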
## Initial approach to get the distance matrix below. The for-loop took too
## long to run, so I vectorized the computation per a recommendation from
## Professor Brambor.
# listings$points <- paste(listings$Longitude, listings$Latitude, sep = ", ")
# severe@data$points <- paste(severe@coords[, 1], severe@coords[, 2], sep = ", ")
# distances <- matrix(NA, nrow = length(listings$points), ncol = length(severe@data$points))
# for (i in 1:length(listings$points)) {
#   point1 <- as.double(unlist(str_extract_all(listings$points[i], "[[:digit:]]+\\.*[[:digit:]]*")))
#   for (j in 1:length(severe@data$points)) {
#     point2 <- as.double(unlist(str_extract_all(severe@data$points[j], "[[:digit:]]+\\.*[[:digit:]]*")))
#     distances[i, j] <- distCosine(point1, point2)
#   }
# }
# write.csv(as.data.frame(distances), file = "distances.csv")
ggplot(severe@data, aes(x = reaction_time, y = station_distance)) +
  geom_point() +
  scale_x_time(limits = c(0, 40 * 60)) +  # limits in seconds: 0 to 40 minutes
  labs(title = "Distance does not equal reaction time for the fire department",
       subtitle = "No clear relationship between reaction time and distance to station",
       x = "Reaction time",
       y = "Distance to station (meters)") +
  theme_few()
Provide a map visualization of response times. Feel free to differentiate by incident type / property affected etc. if that is interesting.
pal_distance <- colorNumeric(
palette = "Reds",
domain = severe$station_distance)
pal_time <- colorNumeric(
palette = "Reds",
domain = as.double(severe$reaction_time))
map_reaction_times <- leaflet(severe) %>%
setView(lng = -74.0060, lat = 40.7128, zoom = 11) %>%
addProviderTiles(providers$CartoDB.PositronNoLabels) %>%
addCircleMarkers(color = ~pal_time(as.double(reaction_time)),
radius = ~as.numeric(reaction_time)/100,
label = ~htmlEscape(as.character(PROPERTY_USE_DESC)),
group = "Reaction time",
popup = paste("Property use:",severe$PROPERTY_USE_DESC,"<br/>",
"Borough:", severe$BOROUGH_DESC,"<br/>",
"Date and time:", severe$INCIDENT_DATE_TIME)) %>%
addCircleMarkers(color = ~pal_distance(station_distance),
radius = ~station_distance/500,
label = ~htmlEscape(as.character(PROPERTY_USE_DESC)),
group = "Distance",
popup = paste("Property use:",severe$PROPERTY_USE_DESC,"<br/>",
"Borough:", severe$BOROUGH_DESC,"<br/>",
"Date and time:", severe$INCIDENT_DATE_TIME)) %>%
addLayersControl(
baseGroups = c("Reaction time", "Distance"),
options = layersControlOptions(collapsed = FALSE))
map_reaction_times
communities <- readOGR("nycd_18a/nycd.shp")
## OGR data source with driver: ESRI Shapefile
## Source: "nycd_18a/nycd.shp", layer: "nycd"
## with 71 features
## It has 3 fields
zip_codes <- readOGR("ZIP_CODE_040114/ZIP_CODE_040114.shp")
## OGR data source with driver: ESRI Shapefile
## Source: "ZIP_CODE_040114/ZIP_CODE_040114.shp", layer: "ZIP_CODE_040114"
## with 263 features
## It has 12 fields
zip_codesWGS84 <- spTransform(zip_codes, CRS("+proj=longlat +datum=WGS84 +no_defs +ellps=WGS84 +towgs84=0,0,0"))
severe_zip <- severe@data %>%
  group_by(ZIP_CODE) %>%
  summarise(mean_reaction_time = mean(reaction_time, na.rm = TRUE),
            mean_station_distance = mean(station_distance, na.rm = TRUE))
zip_codesWGS84 <- merge(zip_codesWGS84, severe_zip, by.x = "ZIPCODE", by.y = "ZIP_CODE")
time_distance_zipcode <- leaflet(zip_codesWGS84) %>%
setView(lng = -74.0060, lat = 40.7128, zoom = 11) %>%
addProviderTiles(providers$CartoDB.PositronNoLabels) %>%
addPolygons(color = "#444444", weight = 1, smoothFactor = 0.5,
opacity = 1.0, fillOpacity = 0.9,
fillColor = ~colorQuantile("Reds", mean_reaction_time)(mean_reaction_time),
highlightOptions = highlightOptions(color = "white", weight = 2,
bringToFront = TRUE),
group = "Time",
label = ~htmlEscape(as.character(mean_reaction_time))) %>%
addPolygons(color = "#444444", weight = 1, smoothFactor = 0.5,
opacity = 1.0, fillOpacity = 0.9,
fillColor = ~colorQuantile("Reds", mean_station_distance)(mean_station_distance),
highlightOptions = highlightOptions(color = "white", weight = 2,
bringToFront = TRUE),
group = "Distance",
label = ~htmlEscape(as.character(mean_station_distance))) %>%
addLayersControl(
baseGroups = c("Time", "Distance"),
options = layersControlOptions(collapsed = FALSE)
)
time_distance_zipcode
Please follow the instructions to submit your homework. The homework is due on Wednesday, March 21.
If you do come across something online that provides part of the analysis or code, please do not copy other people's ideas wholesale. We are trying to evaluate your ability to visualize data, not your ability to do internet searches. Also, this is an individually assigned exercise – please keep your solution to yourself.